Stochastic compositional gradient descent: algorithms for minimizing compositions of expected-value functions

Authors
Abstract


Similar Resources

Stochastic Compositional Gradient Descent: Algorithms for Minimizing Nonlinear Functions of Expected Values

Classical stochastic gradient methods are well suited for minimizing expected-value objective functions. However, they do not apply to the minimization of a nonlinear function of expected values, i.e., problems of the form min_x f(E_w[g_w(x)]). In this paper, we propose a class of stochastic compositional gradient descent (SCGD) algorithms that can be viewed as stochastic versions of q...
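The two-timescale idea behind SCGD can be sketched on a toy instance. The problem, step-size schedules, and functions below are assumptions for illustration only: f(y) = ||y||², g_w(x) = x + w with E[w] = 0, so the composition reduces to min_x ||x||². An auxiliary iterate y runs a fast moving average of inner-function samples, while x takes a slower quasi-gradient step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (assumed for illustration): f(y) = ||y||^2 and
# g_w(x) = x + w with E[w] = 0, so min_x f(E_w[g_w(x)]) = min_x ||x||^2.
def f_grad(y):          # gradient of the outer function f
    return 2.0 * y

def g_sample(x):        # one noisy sample of the inner map g_w(x)
    return x + rng.normal(scale=0.5, size=x.shape)

def g_jac_sample(x):    # Jacobian of g_w at x (identity for this g_w)
    return np.eye(x.shape[0])

def scgd(x0, iters=20000):
    x, y = x0.copy(), g_sample(x0)
    for k in range(1, iters + 1):
        alpha, beta = 0.1 / k**0.75, 1.0 / k**0.5      # two time scales
        y = (1 - beta) * y + beta * g_sample(x)        # fast tracking of E[g_w(x)]
        x = x - alpha * g_jac_sample(x).T @ f_grad(y)  # slow quasi-gradient step
    return x

x_star = scgd(np.array([5.0, -3.0]))
print(np.linalg.norm(x_star))  # should approach 0, the minimizer
```

The key point is that y estimates E_w[g_w(x)] without ever forming the expectation, which is what lets the method handle the nonlinear outer function f.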


Accelerated Stochastic Gradient Descent for Minimizing Finite Sums

We propose an optimization method for minimizing finite sums of smooth convex functions. Our method incorporates accelerated gradient descent (AGD) and the stochastic variance reduced gradient (SVRG) method in a mini-batch setting. Unlike SVRG, our method can be directly applied to both non-strongly and strongly convex problems. We show that our method achieves a lower overall complexity than the re...
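The SVRG building block mentioned above can be sketched as follows; the least-squares finite sum, step size, and epoch lengths are assumptions for illustration, not the paper's accelerated variant. Each epoch takes a snapshot, computes one full gradient there, and then corrects cheap stochastic gradients with it:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy finite sum (assumed for illustration): least squares
# F(x) = (1/n) * sum_i (a_i^T x - b_i)^2 with minimizer x_true.
n, d = 200, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true

def grad_i(x, i):                  # gradient of the i-th component
    return 2.0 * (A[i] @ x - b[i]) * A[i]

def full_grad(x):                  # full gradient of F
    return 2.0 * A.T @ (A @ x - b) / n

def svrg(x0, epochs=30, m=200, step=0.01):
    x = x0.copy()
    for _ in range(epochs):
        x_snap, mu = x.copy(), full_grad(x)   # snapshot and its full gradient
        for _ in range(m):
            i = rng.integers(n)
            # variance-reduced stochastic gradient: unbiased, shrinking variance
            v = grad_i(x, i) - grad_i(x_snap, i) + mu
            x -= step * v
    return x

x_hat = svrg(np.zeros(d))
print(np.linalg.norm(x_hat - x_true))  # error shrinks linearly per epoch
```

The correction term `grad_i(x, i) - grad_i(x_snap, i) + mu` is what removes the variance floor of plain SGD, enabling a constant step size and linear convergence on strongly convex problems.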


Supplementary Materials: Accelerated Stochastic Gradient Descent for Minimizing Finite Sums

1 Proof of Proposition 1. We now prove Proposition 1, which gives a condition for compactness of the sublevel set. Proof. Let B(r) and S(r) denote the ball and the sphere of radius r, centered at the origin. By an affine transformation, we can assume that X∗ contains the origin O, X∗ ⊂ B(1), and X∗ ∩ S(1) = ∅. Then we have that for all x ∈ S(1), ⟨∇f(x), x⟩ ≥ f(x) − f(O) > 0, where we use convexity for ...


Convergence Analysis of Gradient Descent Stochastic Algorithms

This paper proves convergence of a sample-path based stochastic gradient-descent algorithm for optimizing expected-value performance measures in discrete event systems. The algorithm uses increasing precision at successive iterations, and it moves against the direction of a generalized gradient of the computed sample performance function. Two convergence results are established: one, for the ca...
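The increasing-precision idea can be sketched on a minimal example; the objective, distribution, and growth rule below are assumptions for illustration: F(x) = E[(x − W)²] with W ~ N(1, 1), estimated by a sample average whose size grows with the iteration count:

```python
import numpy as np

rng = np.random.default_rng(2)

# Sketch of a sample-path gradient method with increasing precision
# (toy setting assumed): minimize F(x) = E[(x - W)^2], W ~ N(1, 1),
# whose minimizer is E[W] = 1.
def sample_grad(x, n):
    w = rng.normal(loc=1.0, size=n)   # n fresh sample paths
    return np.mean(2.0 * (x - w))     # sample-average gradient of F

x = 5.0
for k in range(1, 2001):
    g = sample_grad(x, n=k)           # precision increases with k
    x -= (1.0 / k) * g                # move against the computed gradient
print(x)  # should approach E[W] = 1
```

Growing the sample size per iteration drives the gradient-estimation error to zero, which is the mechanism the convergence analysis relies on.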


Intrinsic Geometry of Stochastic Gradient Descent Algorithms

We consider the intrinsic geometry of stochastic gradient descent (SG) algorithms. We show how to derive SG algorithms that fully respect an underlying geometry which can be induced by either prior knowledge in the form of a preferential structure or a generative model via the Fisher information metric. We show that using the geometrically motivated update and the “correct” loss function, the i...
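A minimal sketch of a Fisher-metric (natural-gradient) SG update, on an assumed toy model rather than anything from the paper: a Bernoulli likelihood p = sigmoid(θ), whose Fisher information is I(θ) = p(1 − p). The natural-gradient step rescales the ordinary score by I(θ)⁻¹:

```python
import math, random

random.seed(0)

# Natural-gradient SGD sketch (toy setting assumed): Bernoulli model
# p = sigmoid(theta) with Fisher information I(theta) = p * (1 - p).
def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

p_true = 0.8
theta = 0.0
for k in range(1, 5001):
    y = 1.0 if random.random() < p_true else 0.0
    p = sigmoid(theta)
    score = y - p                        # d/dtheta of log p(y | theta)
    fisher = max(p * (1 - p), 1e-6)      # Fisher information, clamped for safety
    theta += (1.0 / k) * score / fisher  # natural-gradient step, 1/k schedule
print(sigmoid(theta))  # estimate of p_true
```

Because the update is rescaled by the metric, it is (to first order) invariant to how the parameter is coordinatized, which is the sense in which such updates "respect the underlying geometry."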



Journal

Journal title: Mathematical Programming

Year: 2016

ISSN: 0025-5610, 1436-4646

DOI: 10.1007/s10107-016-1017-3